Maximal Input Reduction of Sequential Netlists via Synergistic Reparameterization and Localization Strategies
Authors
Abstract
Automatic formal verification techniques generally require exponential resources with respect to the number of primary inputs of a netlist. In this paper, we present several fully automated techniques that enable maximal input reductions of sequential netlists. First, we present a novel min-cut based localization refinement scheme that yields a safely overapproximated netlist with minimal input count. Second, we present a novel form of reparameterization, cast as a trace-equivalence-preserving structural abstraction, which provably yields a netlist whose input count is at most a constant factor of its register count. In contrast to prior research that uses reparameterization to offset input growth during symbolic simulation, we are the first to explore this technique as a structural transformation for sequential netlists, making its benefits available to general verification flows. In particular, we detail the synergy between these input-reducing abstractions, and between them and other transformations such as retiming, which, like traditional localization approaches, risks substantially increasing input count as a byproduct of its register reductions. Experiments confirm that the complementary reduction strategy enabled by our techniques is necessary for iteratively reducing large problems while keeping both proof-fatal design-size metrics, register count and input count, within reasonable limits, ultimately enabling an efficient automated solution.
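To make the min-cut based localization idea concrete, the sketch below models a netlist as a directed graph of signals and selects the abstraction frontier with the fewest cut signals, since every cut signal becomes a fresh primary input of the overapproximated netlist. This is a hypothetical illustration only, not the paper's implementation; the graph encoding, the function name min_cut_frontier, and the use of the networkx max-flow/min-cut routine are assumptions introduced here.

import networkx as nx

def min_cut_frontier(netlist, keep, discard):
    """Return the set of signals to cut when abstracting `discard` away from `keep`.

    netlist : nx.DiGraph whose nodes are signals and whose edges point driver -> sink.
    keep    : signals that must remain exact (e.g., the refined cone of the property).
    discard : signals eligible for abstraction (e.g., unrefined registers and inputs);
              assumed disjoint from `keep`.
    """
    inf = float("inf")
    flow = nx.DiGraph()

    # Standard node-splitting construction: cutting a signal costs 1 (its internal
    # edge), wires between signals are free, and signals in `keep` may never be cut.
    for n in netlist.nodes():
        flow.add_edge((n, "in"), (n, "out"), capacity=inf if n in keep else 1)
    for u, v in netlist.edges():
        flow.add_edge((u, "out"), (v, "in"), capacity=inf)

    # Collapse both sides into a single source/sink for the max-flow computation.
    for n in discard:
        flow.add_edge("SRC", (n, "in"), capacity=inf)
    for n in keep:
        flow.add_edge((n, "out"), "SNK", capacity=inf)

    _, (src_side, _) = nx.minimum_cut(flow, "SRC", "SNK")

    # A signal lies on the minimum cut iff its "in" half is reachable from the source
    # but its "out" half is not; each such signal becomes a fresh primary input.
    return {n for n in netlist.nodes()
            if (n, "in") in src_side and (n, "out") not in src_side}

The sketch only shows the cut-selection step; in a localization refinement loop such a frontier would be recomputed as registers are moved between the kept and discarded sets, trading off which logic stays exact against how many fresh inputs the overapproximation introduces.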
Publication date: 2005